
    On the convergence analysis of DCA

    In this paper, we propose a clean and general proof framework for the convergence analysis of the Difference-of-Convex (DC) programming algorithm (DCA), covering both the standard DC program and the convex-constrained DC program. We first discuss suitable assumptions for the well-definedness of DCA. Then we focus on the convergence analysis of DCA, in particular the global convergence of the sequence $\{x^k\}$ generated by DCA under the Łojasiewicz subgradient inequality and the Kurdyka-Łojasiewicz property, respectively. Moreover, the convergence rates of the sequences $\{f(x^k)\}$ and $\{\|x^k - x^*\|\}$ are also investigated. We hope that the proof framework presented in this article will be a useful tool for conveniently establishing convergence analyses for many variants of DCA and new DCA-type algorithms.
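    The DCA iteration analyzed in this abstract is simple to state: given a DC decomposition f = g - h with g, h convex, each step linearizes h at the current iterate x^k and minimizes the resulting convex majorant. A minimal sketch (the toy problem f(x) = x^4 - 2x^2 and its closed-form subproblem solver are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def dca(x0, grad_h, argmin_step, tol=1e-10, max_iter=200):
    """Generic DCA loop for f = g - h:
    x^{k+1} in argmin_x g(x) - <y^k, x>, where y^k is a subgradient of h at x^k."""
    x = x0
    for _ in range(max_iter):
        y = grad_h(x)            # subgradient of the concave part's negative, h, at x^k
        x_new = argmin_step(y)   # solve the convex subproblem
        if abs(x_new - x) < tol: # stationarity of the iterate sequence
            return x_new
        x = x_new
    return x

# Toy DC program: f(x) = x^4 - 2x^2 with g(x) = x^4 and h(x) = 2x^2.
# The convex subproblem min_x x^4 - y*x has the closed-form solution
# x = sign(y) * (|y|/4)^(1/3), i.e. the real root of 4x^3 = y.
grad_h = lambda x: 4.0 * x
argmin_step = lambda y: np.sign(y) * (abs(y) / 4.0) ** (1.0 / 3.0)

x_star = dca(x0=0.5, grad_h=grad_h, argmin_step=argmin_step)
```

    Starting from x^0 = 0.5, the iterates follow x^{k+1} = (x^k)^{1/3} and converge to the critical point x^* = 1, illustrating the global convergence of the sequence {x^k} that the paper establishes under the Łojasiewicz-type conditions.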

    An Accelerated DC Programming Approach with Exact Line Search for The Symmetric Eigenvalue Complementarity Problem

    In this paper, we are interested in developing an accelerated Difference-of-Convex (DC) programming algorithm based on exact line search for efficiently solving the Symmetric Eigenvalue Complementarity Problem (SEiCP) and the Symmetric Quadratic Eigenvalue Complementarity Problem (SQEiCP). We first prove that any SEiCP is equivalent to an SEiCP with symmetric positive definite matrices only. Then, we establish DC programming formulations for two equivalent formulations of SEiCP (namely, the logarithmic formulation and the quadratic formulation), and propose the accelerated DC algorithm (BDCA), which combines the classical DCA with an inexpensive exact line search, performed by finding the real roots of a binomial, for acceleration. We demonstrate the equivalence between SQEiCP and SEiCP, and extend BDCA to SQEiCP. Numerical simulations of the proposed BDCA and DCA against KNITRO, FILTERSD, and MATLAB FMINCON for SEiCP and SQEiCP on both synthetic datasets and the Matrix Market NEP Repository are reported. BDCA dramatically accelerates the convergence of DCA toward better numerical solutions, and outperforms the KNITRO, FILTERSD, and FMINCON solvers in terms of average CPU time and average solution precision, especially for large-scale cases. (Comment: 24 pages)
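    The BDCA pattern described above (a DCA step followed by an exact line search whose optimality condition reduces to finding the root of a low-degree polynomial) can be illustrated on a simple quadratic DC program. Everything below is a hedged sketch under assumptions not in the abstract: a quadratic objective f(x) = ½xᵀAx - bᵀx split as g - h with g strongly convex, where the exact step size solves a degree-one equation rather than the binomial arising in the SEiCP formulations:

```python
import numpy as np

def bdca_quadratic(A, b, rho, x0, tol=1e-10, max_iter=500):
    """BDCA sketch for f(x) = 0.5 x^T A x - b^T x with the DC split
    g(x) = 0.5 x^T (A + rho*I) x - b^T x and h(x) = 0.5 rho ||x||^2."""
    n = len(b)
    G = A + rho * np.eye(n)  # Hessian of the convex part g (rho chosen large enough)
    x = x0.copy()
    for _ in range(max_iter):
        y = np.linalg.solve(G, b + rho * x)  # DCA step: minimize g(x) - <rho x^k, x>
        d = y - x                            # boosting direction
        curv = d @ A @ d
        if curv > 0:
            # exact line search: phi(a) = f(y + a*d) is quadratic in a,
            # so phi'(a) = 0 is linear and has a closed-form real root
            grad_y = A @ y - b
            alpha = max(0.0, -(grad_y @ d) / curv)
        else:
            alpha = 0.0                      # no curvature: keep the plain DCA iterate
        x_new = y + alpha * d
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical toy instance: A positive definite, so the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bdca_quadratic(A, b, rho=1.0, x0=np.zeros(2))
```

    The boosting step reuses quantities already computed by the DCA iteration, which is why the acceleration is inexpensive: only one extra objective-gradient evaluation and a scalar root-finding problem per iteration.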

    Higher-order Moment Portfolio Optimization via The Difference-of-Convex Programming and Sums-of-Squares

    We are interested in developing a Difference-of-Convex (DC) programming approach based on Difference-of-Convex-Sums-of-Squares (DC-SOS) decomposition techniques for the higher-order moment (Mean-Variance-Skewness-Kurtosis) portfolio optimization model. This problem can be formulated as a nonconvex quartic multivariate polynomial optimization problem, for which a DC programming formulation based on the recently developed DC-SOS decomposition is investigated. A well-known DC algorithm, namely DCA, can then be used for its numerical solution. Moreover, an acceleration technique for DCA, namely Boosted-DCA (BDCA), based on an inexact (Armijo-type) line search, is proposed to accelerate the convergence of DCA for smooth and nonsmooth DC programs with convex constraints. This technique is applied to DCA based on the DC-SOS decomposition and to DCA based on the universal DC decomposition. Numerical simulations of DCA and Boosted-DCA on synthetic and real datasets are reported. Comparisons with several non-DC-programming optimization solvers (KNITRO, FILTERSD, IPOPT, and MATLAB fmincon) demonstrate that our Boosted-DC algorithms achieve the same numerical results with performance comparable to these efficient methods on the higher-order moment portfolio optimization model. (Comment: 42 pages, 13 figures)
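    The Armijo-type inexact line search that boosts DCA can be sketched as a standard backtracking loop. A minimal sketch, assuming the common BDCA sufficient-decrease test f(y + αd) ≤ f(y) - σα²‖d‖² along the boosting direction d = y - x_prev (the test function, parameter values, and toy instance below are illustrative, not from the paper):

```python
import numpy as np

def armijo_boost(f, y, d, beta=0.5, sigma=1e-4, alpha0=1.0, max_backtracks=30):
    """Armijo-type backtracking to boost a DCA iterate y along d = y - x_prev:
    accept the largest alpha = alpha0 * beta^j satisfying
    f(y + alpha*d) <= f(y) - sigma * alpha^2 * ||d||^2."""
    fy = f(y)
    dd = float(d @ d)
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(y + alpha * d) <= fy - sigma * alpha**2 * dd:
            return alpha
        alpha *= beta
    return 0.0  # no step accepted: fall back to the plain DCA iterate

# Hypothetical smooth example: f(x) = ||x||^2, boosting from y = (1,) along d = (-1,).
f = lambda x: float(x @ x)
y = np.array([1.0])
d = np.array([-1.0])
alpha = armijo_boost(f, y, d)
```

    Unlike the exact line search used for the eigenvalue complementarity problems, this inexact search only needs objective evaluations, which suits the quartic polynomial objective of the portfolio model where an exact minimizer along d has no convenient closed form.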